Search results for "Model of computation"

Showing 8 of 8 documents

Models of Computation, Riemann Hypothesis, and Classical Mathematics

1998

Classical mathematics has been a source of ideas for Computer Science since its very first days. Surprisingly, there is still much to be found. Computer scientists, especially those in Theoretical Computer Science, find inspiring ideas both in old notions and results and in 20th-century mathematics. The last decades have brought us evidence that computer scientists will soon study quantum physics and modern biology just to understand what computers are doing.

Classical mathematics, Finite-state machine, Computer science, Model of computation, Epistemology, Philosophy of computer science, Philosophy of language, Turing machine, Riemann hypothesis, Formal language, Artificial intelligence

On finding common neighborhoods in massive graphs

2003

Abstract: We consider the problem of finding pairs of vertices that share large common neighborhoods in massive graphs. We prove lower bounds on the resources needed to solve this problem on resource-bounded models of computation. In streaming models, in which algorithms can access the input only a constant number of times and only sequentially, we show that, even with randomization, any algorithm that determines if there exists any pair of vertices with a large common neighborhood must essentially store and process the input graph off-line. In sampling models, in which algorithms can only query an oracle for the common neighborhoods of specified vertex pairs, we show that any algorithm must …
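
The problem the abstract studies can be stated concretely with a minimal off-line sketch (graph data and vertex names are illustrative, not from the paper): computing every pair's common-neighborhood size requires holding the whole adjacency structure, which is exactly what the lower bound says a streaming algorithm cannot avoid.

```python
from itertools import combinations

def common_neighborhoods(adj):
    """Size of the common neighborhood for every vertex pair.

    adj maps each vertex to the set of its neighbors. This off-line
    sketch stores the entire graph in memory.
    """
    return {
        (u, v): len(adj[u] & adj[v])
        for u, v in combinations(sorted(adj), 2)
    }

adj = {
    "a": {"b", "c", "d"},
    "b": {"a", "c", "d"},
    "c": {"a", "b"},
    "d": {"a", "b"},
}
sizes = common_neighborhoods(adj)
# ("c", "d") share neighbors {"a", "b"}, so sizes[("c", "d")] == 2
```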

Combinatorics, General Computer Science, Model of computation, Existential quantification, Graph, Oracle, Off-line, Theoretical Computer Science, Vertex (geometry), Mathematics

New results for finding common neighborhoods in massive graphs in the data stream model

2008

Abstract: We consider the problem of finding pairs of vertices that share large common neighborhoods in massive graphs. We give lower bounds for randomized, two-sided-error algorithms that solve this problem in the data-stream model of computation. Our results correct and improve those of Buchsbaum, Giancarlo, and Westbrook [On finding common neighborhoods in massive graphs, Theoretical Computer Science 299 (1–3) (2004) 707–718].

Data stream, Discrete mathematics, General Computer Science, Extremal graph theory, Space lower bounds, Model of computation, Communication complexity, Graph theory, Upper and lower bounds, Theoretical Computer Science, Combinatorics, Graph algorithms for data streams, Graph algorithms, Mathematics

Span-Program-Based Quantum Algorithms for Graph Bipartiteness and Connectivity

2016

A span program is a linear-algebraic model of computation that can be used to design quantum algorithms. For any Boolean function there exists a span program that leads to a quantum algorithm with optimal quantum query complexity. In general, finding such span programs is not an easy task. In this work, given query access to the adjacency matrix of a simple graph G with n vertices, we provide two new span-program-based quantum algorithms: an algorithm for testing if the graph is bipartite that uses $O(n\sqrt{n})$ quantum queries, and an algorithm for testing if the graph is connected that uses $O(n\sqrt{n})$ quantum queries.

Discrete mathematics, Computer science, Existential quantification, Model of computation, Bipartite graph, Graph (abstract data type), Quantum algorithm, Adjacency matrix, Boolean function, Quantum

Understanding Quantum Algorithms via Query Complexity

2017

Query complexity is a model of computation in which we have to compute a function $f(x_1, \ldots, x_N)$ of variables $x_i$ that can be accessed via queries. The complexity of an algorithm is measured by the number of queries that it makes. Query complexity is widely used for studying quantum algorithms, for two reasons. First, it includes many of the known quantum algorithms (including Grover's quantum search and a key subroutine of Shor's factoring algorithm). Second, one can prove lower bounds on the query complexity, bounding the possible quantum advantage. In the last few years, there have been major advances on several longstanding problems in query complexity. In this talk, we su…
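
The query model in the abstract can be sketched classically: wrap the input behind an oracle and charge one unit per read. The wrapper class and the OR example below are illustrative, not from the talk; the point is that classical OR needs up to $N$ queries, whereas Grover's search answers it with $O(\sqrt{N})$ quantum queries.

```python
class QueryOracle:
    """Hide the input x behind queries and count how many are made."""

    def __init__(self, x):
        self._x = list(x)
        self.queries = 0

    def __call__(self, i):
        self.queries += 1
        return self._x[i]

def compute_or(oracle, n):
    # Classical OR of n bits: query positions until a 1 is found.
    return any(oracle(i) for i in range(n))

oracle = QueryOracle([0, 0, 1, 0])
result = compute_or(oracle, 4)
# result is True, reached after 3 queries (short-circuits at index 2)
```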

Discrete mathematics, FOS: Computer and information sciences, Quantum Physics, Computer science, Model of computation, Subroutine, FOS: Physical sciences, Function (mathematics), Computational Complexity (cs.CC), Symmetric function, Partial function, Key (cryptography), Quantum algorithm, Quantum Physics (quant-ph)

A Survey of Continuous-Time Computation Theory

1997

Motivated partly by the resurgence of neural computation research, and partly by advances in device technology, there has been a recent increase of interest in analog, continuous-time computation. However, while special-case algorithms and devices are being developed, relatively little work exists on the general theory of continuous-time models of computation. In this paper, we survey the existing models and results in this area and point to some of the open research questions.

Discrete mathematics, Theoretical computer science, Computability, Computation, Model of computation, Neural computation, Turing machine, Models of neural computation, Computable function, Open research, Theory of computation, Hopfield network, Cellular automaton, Differential analyzer, Mathematics

AMCAS: Advanced Methods for the Co-Design of Complex Adaptive Systems

2006

Abstract: This work proposes a new approach to designing and programming Complex Adaptive Systems (CAS); such systems comprise neural networks, intelligent agents, genetic algorithms, support vector machines, and artificial-intelligence systems in general. Due to the complexity of such systems, it is necessary to build a design environment that eases the design work, allowing reusability and easy migration to hardware and/or software. Ptolemy II is used as the base system to simulate and evaluate the designs under different Models of Computation so that an optimal decision about the hardware or software implementation platform can be made.

Hardware architecture, System of systems, Computer science, Model of computation, Distributed computing, Intelligent agent, Software, Computer engineering, Systems development life cycle, Systems design, Hardware compatibility list, Reusability

Classical and Quantum Computations with Restricted Memory

2018

Automata and branching programs are well-known models of computation with restricted memory. These models have been the focus of many researchers over the last decades. Streaming algorithms are a modern model of computation with restricted memory. In this paper, we present recent results on the comparative computational power of quantum and classical models of branching programs and streaming algorithms.
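
A classical illustration of computation with restricted memory (not from the paper itself) is the Boyer–Moore majority vote: a streaming algorithm that makes one sequential pass over its input while keeping only a candidate and a counter, i.e., O(1) memory regardless of stream length.

```python
def stream_majority(stream):
    """Boyer-Moore majority vote: one pass, O(1) memory.

    Returns the majority element if one exists (count > n/2);
    otherwise the returned candidate needs a second pass to verify.
    """
    candidate, count = None, 0
    for item in stream:
        if count == 0:
            candidate, count = item, 1
        elif item == candidate:
            count += 1
        else:
            count -= 1
    return candidate

# "b" occurs 4 times out of 7, so it is the majority element
assert stream_majority(["a", "b", "b", "c", "b", "a", "b"]) == "b"
```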

Theoretical computer science, Computer science, Computation, Model of computation, Hash function, Automaton, Branching programs, Streaming algorithm, Quantum, Quantum computer